Variational Bayes for generalized autoregressive models
Authors
Abstract
We describe a variational Bayes (VB) learning algorithm for generalized autoregressive (GAR) models. The noise is modeled as a mixture of Gaussians rather than the usual single Gaussian. This allows different data points to be associated with different noise levels and effectively provides robust estimation of AR coefficients. The VB framework is used to prevent overfitting and provides model-order selection criteria both for AR order and noise model order. We show that for the special case of Gaussian noise and uninformative priors on the noise and weight precisions, the VB framework reduces to the Bayesian evidence framework. The algorithm is applied to synthetic and real data with encouraging results.
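To make the modeling idea concrete, the sketch below fits AR coefficients under zero-mean Gaussian-mixture noise using plain maximum-likelihood EM. It is a simplified stand-in for illustration only, not the paper's variational Bayes algorithm: it omits the priors on the weights and precisions, the VB updates, and the model-order selection criteria described above. All names (e.g. fit_ar_mixture_noise) are ours, not the paper's.

```python
# Illustrative sketch: maximum-likelihood EM for an AR(p) model whose noise
# is a K-component zero-mean Gaussian mixture. This mirrors the "different
# data points get different noise levels" idea, but is NOT the VB-GAR
# algorithm of the paper (no priors, no variational model-order selection).
import numpy as np

def fit_ar_mixture_noise(y, p=2, K=2, n_iter=50, seed=0):
    """Fit AR(p) coefficients with K-component Gaussian-mixture noise via EM."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    T = len(y) - p
    # Design matrix of lagged values: row i holds [y[p+i-1], ..., y[p+i-p]]
    X = np.column_stack([y[p - 1 - j:p - 1 - j + T] for j in range(p)])
    targets = y[p:]
    # Initialise AR weights by ordinary least squares
    w, *_ = np.linalg.lstsq(X, targets, rcond=None)
    resid = targets - X @ w
    pi = np.full(K, 1.0 / K)                                   # mixing weights
    beta = 1.0 / (np.var(resid) * rng.uniform(0.5, 2.0, K))    # noise precisions
    for _ in range(n_iter):
        # E-step: responsibility of each noise component for each residual
        resid = targets - X @ w
        log_r = np.log(pi) + 0.5 * np.log(beta) - 0.5 * beta * resid[:, None] ** 2
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares for w (per-sample effective precision),
        # then closed-form updates for pi and beta. No degeneracy safeguards.
        s = r @ beta
        Xw = X * s[:, None]
        w = np.linalg.solve(X.T @ Xw, Xw.T @ targets)
        Nk = r.sum(axis=0)
        pi = Nk / Nk.sum()
        resid = targets - X @ w
        beta = Nk / (r * resid[:, None] ** 2).sum(axis=0)
    return w, pi, beta

if __name__ == "__main__":
    # Synthetic AR(2) data with occasional large-noise samples (outliers)
    rng = np.random.default_rng(1)
    n = 500
    y = np.zeros(n)
    for i in range(2, n):
        noise = rng.normal(0, 0.1) if rng.random() < 0.9 else rng.normal(0, 1.0)
        y[i] = 0.6 * y[i - 1] - 0.3 * y[i - 2] + noise
    w, pi, beta = fit_ar_mixture_noise(y, p=2, K=2)
    print("AR coefficients:", w)   # should be close to [0.6, -0.3]
```

The hypothetical example above illustrates why mixture noise gives robust coefficient estimates: samples assigned to the low-precision component are automatically down-weighted in the weighted least-squares step.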
Similar resources
The Variational EM Algorithm for On-line Identification of Extended AR Models
The AutoRegressive (AR) model is extended to cope with a wide class of possible transformations and degradations. The Variational Bayes (VB) procedure is used to restore conjugacy. The resulting Bayesian recursive identification procedure has many of the desirable computational properties of the classical RLS procedure. During each time-step, an iterative Variational EM (VEM) procedure is requi...
Approximate Recursive Identification of Autoregressive Systems with Skewed Innovations
We propose a novel recursive system identification algorithm for linear autoregressive systems with skewed innovations. The algorithm is based on the variational Bayes approximation of the model with a multivariate normal prior for the model coefficients, multivariate skew-normally distributed innovations, and a matrix-variate normal–inverse-Wishart prior for the parameters of the innovation distr...
Non-linear Bayesian prediction of generalized order statistics for lifetime models
In this paper, we obtain Bayesian prediction intervals as well as Bayes predictive estimators under squared-error loss for generalized order statistics when the distribution of the underlying population belongs to a family which includes several important distributions.
Frequentist Consistency of Variational Bayes
A key challenge for modern Bayesian statistics is how to perform scalable inference of posterior distributions. To address this challenge, variational Bayes (VB) methods have emerged as a popular alternative to the classical Markov chain Monte Carlo (MCMC) methods. VB methods tend to be faster while achieving comparable predictive performance. However, there are few theoretical results around V...
Recurrent Gaussian Processes
We define Recurrent Gaussian Processes (RGP) models, a general family of Bayesian nonparametric models with recurrent GP priors which are able to learn dynamical patterns from sequential data. Similar to Recurrent Neural Networks (RNNs), RGPs can have different formulations for their internal states, distinct inference methods and be extended with deep structures. In such context, we propose a ...
Journal: IEEE Trans. Signal Processing
Volume: 50
Issue: -
Pages: -
Publication year: 2002